Morozov, Ivanov and Tikhonov Regularization Based LS-SVMs

Authors

  • Kristiaan Pelckmans
  • Johan A. K. Suykens
  • Bart De Moor
Abstract

This paper contrasts three related regularization schemes for kernel machines with a least squares criterion, namely Tikhonov and Ivanov regularization and Morozov's discrepancy principle. We derive the conditions for optimality in a least squares support vector machine (LS-SVM) context, where the schemes differ in the role played by the regularization parameter. In particular, the Ivanov and Morozov schemes express the trade-off between data fit and smoothness through a trust region on the parameters and through the noise level, respectively, both of which can be transformed uniquely into an appropriate regularization constant for a standard LS-SVM. This insight is used to automatically tune the regularization constant in an LS-SVM framework based on the estimated noise level, which can be obtained with a nonparametric technique such as the differogram estimator.
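As an illustration of how the Morozov view can drive the choice of the LS-SVM regularization constant, the sketch below tunes gamma so that the mean squared residual matches an externally supplied noise variance. This is a minimal sketch, not the authors' implementation: the RBF kernel, toy data, bisection bounds, and the plug-in value of sigma2 are assumptions; in the paper the noise level would come from a nonparametric estimator such as the differogram.

```python
import numpy as np

def rbf_kernel(X1, X2, bandwidth=1.0):
    # Gaussian RBF kernel: K[i, j] = exp(-||x_i - x_j||^2 / (2 * bandwidth^2))
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def lssvm_fit(K, y, gamma):
    # Solve the standard LS-SVM regression dual system
    #   [ 0      1^T         ] [ b     ]   [ 0 ]
    #   [ 1   K + I / gamma  ] [ alpha ] = [ y ]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    e = alpha / gamma          # optimality condition: alpha_i = gamma * e_i
    return b, alpha, e

def gamma_from_noise_level(K, y, sigma2, lo=1e-6, hi=1e6, iters=60):
    # Morozov-style choice of gamma: the residual norm decreases monotonically
    # in gamma, so bisect (in log scale) until mean(e_i^2) matches sigma2.
    for _ in range(iters):
        mid = np.sqrt(lo * hi)
        _, _, e = lssvm_fit(K, y, mid)
        if np.mean(e ** 2) > sigma2:
            lo = mid           # residuals too large: increase gamma
        else:
            hi = mid           # residuals too small: decrease gamma
    return np.sqrt(lo * hi)

# Toy usage; sigma2 plays the role of the estimated noise level.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(80, 1))
y = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(80)
K = rbf_kernel(X, X)
gamma = gamma_from_noise_level(K, y, sigma2=0.01)
b, alpha, _ = lssvm_fit(K, y, gamma)
```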


Related articles

A Converged Algorithm for Tikhonov Regularized Nonnegative Matrix Factorization with Automatic Regularization Parameters Determination

We present a converged algorithm for Tikhonov-regularized nonnegative matrix factorization (NMF). We choose this regularization specifically because Tikhonov-regularized least squares (LS) is known to be preferable to conventional LS for solving linear inverse problems. Because an NMF problem can be decomposed into LS subproblems, it can be expected that Tikhonov regularized...
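For context (notation ours, not taken from the snippet): the Tikhonov-regularized LS subproblem replaces the plain normal equations with a better-conditioned system,

$$\min_x \;\|Ax - b\|_2^2 + \lambda \|x\|_2^2 \quad\Longrightarrow\quad x_\lambda = (A^\top A + \lambda I)^{-1} A^\top b,$$

so that $A^\top A + \lambda I$ is invertible even when $A^\top A$ is rank-deficient or ill-conditioned.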


Additive Regularization: Fusion of Training and Validation Levels in Kernel Methods

In this paper, the training of Least Squares Support Vector Machines (LS-SVMs) for classification and regression and the determination of their regularization constants are reformulated in terms of additive regularization. In contrast with the classical Tikhonov scheme, a major advantage of this additive regularization mechanism is that it enables computational fusion of the training and...


Inverse Scale Space Theory for Inverse Problems

In this paper we derive scale space methods for inverse problems which satisfy the fundamental axioms of fidelity and causality, and we provide numerical illustrations of the use of such methods in deblurring. These scale space methods are asymptotic formulations of the Tikhonov-Morozov regularization method. The analysis and illustrations relate diffusion filtering methods in image pr...


Multi-parameter Tikhonov regularization and model function approach to the damped Morozov principle for choosing regularization parameters

In this paper, we study the multi-parameter Tikhonov regularization method, which adds multiple different penalties to exhibit multi-scale features of the solution. An optimal error bound on the regularized solution is obtained for an a priori choice of the multiple regularization parameters. Some theoretical results on the dependence of the regularized solution on the regularization parameters are...
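For orientation (generic form, not necessarily the specific functional studied in that paper): a multi-parameter Tikhonov functional adds several penalties with separate regularization parameters,

$$\min_x \;\|Ax - b\|^2 + \sum_{i=1}^{m} \lambda_i \,\|L_i x\|^2,$$

where each pair $(\lambda_i, L_i)$ targets a different scale or feature of the solution.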


Componentwise Least Squares Support Vector Machines

This chapter describes componentwise Least Squares Support Vector Machines (LS-SVMs) for the estimation of additive models consisting of a sum of nonlinear components. The primal-dual derivations characterizing LS-SVMs for the estimation of the additive model result in a single set of linear equations whose size grows with the number of data points. The derivation is elaborated for the classific...
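As a sketch of the additive-model setting (notation ours): the componentwise LS-SVM estimates

$$\hat f(x) = \sum_{p=1}^{P} w_p^{\top} \varphi_p\big(x^{(p)}\big) + b,$$

by minimizing $\tfrac{1}{2}\sum_p w_p^{\top} w_p + \tfrac{\gamma}{2}\sum_i e_i^2$ subject to $y_i = \sum_p w_p^{\top}\varphi_p(x_i^{(p)}) + b + e_i$; in the dual this leads to the same type of linear system as a standard LS-SVM, with the kernel matrix replaced by the sum of componentwise kernels, $\Omega = \sum_p \Omega_p$.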





Publication date: 2004